Deep mixed convolution model for pulmonary nodule detection
QI Yongjun, GU Junhua, ZHANG Yajuan, WANG Feng, TIAN Zepei
Journal of Computer Applications    2020, 40 (10): 2904-2909.   DOI: 10.11772/j.issn.1001-9081.2020020192
Pulmonary nodule detection in high-dimensional lung Computed Tomography (CT) images is a very challenging task. Among the many pulmonary nodule detection algorithms, deep Convolutional Neural Networks (CNNs) are the most attractive. Within this family, Two-Dimensional (2D) CNNs, with many pre-trained models and high detection efficiency, are widely used. However, pulmonary nodules are inherently Three-Dimensional (3D) lesions, so 2D CNNs inevitably lose spatial information and thereby reduce detection accuracy. 3D CNNs can make full use of the spatial information in CT images and effectively improve detection accuracy, but they suffer from many parameters, large computational cost and a high risk of overfitting. To combine the advantages of the two kinds of network, a pulmonary nodule detection model based on a deep mixed CNN was proposed. By deploying 3D CNNs in the shallow layers of the model and 2D CNNs in the deep layers, and adding a deconvolution module to fuse multi-layer image features, the number of model parameters was reduced and the generalization ability and detection efficiency of the model were improved without decreasing detection accuracy. Experimental results on the LUNA16 dataset show that the proposed model reaches a sensitivity of 0.924 at an average of 8 false positives per scan, outperforming existing state-of-the-art models.
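As a rough illustration of the mixed design described above, the sketch below (assuming PyTorch) places 3D convolutions in the shallow layers, folds the depth axis into the channel dimension before the 2D layers, and fuses deep and shallow features with a transposed convolution; all layer sizes and the classification head are illustrative and are not taken from the paper.

```python
# Minimal sketch of a mixed 3D/2D CNN for nodule candidate classification (PyTorch assumed).
# All layer sizes are illustrative; they are not taken from the paper.
import torch
import torch.nn as nn

class MixedCNN(nn.Module):
    def __init__(self, depth=16):
        super().__init__()
        # Shallow layers: 3D convolutions keep the spatial context of the CT cube.
        self.conv3d = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(kernel_size=(2, 2, 2)),
        )
        # Deep layers: 2D convolutions after folding the depth axis into channels.
        self.conv2d = nn.Sequential(
            nn.Conv2d(32 * (depth // 2), 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(64, 64, kernel_size=3, padding=1), nn.ReLU(),
        )
        # Deconvolution module: upsample deep features and fuse them with shallow ones.
        self.deconv = nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2)
        self.head = nn.Sequential(
            nn.Conv2d(32 + 32 * (depth // 2), 32, kernel_size=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 2),
        )

    def forward(self, x):            # x: (batch, 1, depth, H, W) CT cube, H and W divisible by 4
        f3 = self.conv3d(x)          # (batch, 32, depth/2, H/2, W/2)
        b, c, d, h, w = f3.shape
        shallow = f3.reshape(b, c * d, h, w)   # fold the depth axis into channels
        deep = self.conv2d(shallow)            # (batch, 64, H/4, W/4)
        up = self.deconv(deep)                 # upsample back to (batch, 32, H/2, W/2)
        fused = torch.cat([up, shallow], dim=1)
        return self.head(fused)
```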
Medical image registration by integrating modified brain storm optimization algorithm and Powell algorithm
LIANG Zhigang, GU Junhua
Journal of Computer Applications    2018, 38 (9): 2683-2688.   DOI: 10.11772/j.issn.1001-9081.2018020353
Aiming at the problems of existing medical image registration methods, such as poor accuracy, susceptibility to local optima and slow convergence, a hybrid algorithm combining a Modified Brain Storm Optimization (MBSO) algorithm with the Powell algorithm was proposed on the basis of multi-resolution analysis. In the MBSO algorithm, the proportion of individuals participating in local and global search was adjusted by changing the way individuals are generated, and a variable step size was adopted to enhance the search ability, thereby accelerating convergence and escaping local optima. Firstly, the MBSO algorithm was used to search globally in the low-resolution layer. Then, the result was used as the starting point of the Powell algorithm to search in the high-resolution layer. Finally, the Powell algorithm was used to locate the global optimum in the original image layer. Compared with the Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO) and Genetic Algorithm (GA) methods combined with the Powell algorithm, the proposed algorithm reduced the average root mean square error by 20.89%, 30.46% and 18.54% and the average registration time by 17.86%, 27.05% and 26.60% respectively, with a success rate of 100%. The experimental results show that the proposed algorithm has good robustness and can accomplish the medical image registration task quickly and accurately.
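The coarse-to-fine hybrid strategy can be sketched as below (assuming NumPy and SciPy). The global stage is a plain random-perturbation search standing in for MBSO, and similarity(params, level) is a hypothetical cost function, for example the negative mutual information of the two images at a given pyramid level.

```python
# Sketch of the coarse-to-fine hybrid strategy (SciPy assumed): a stochastic global
# search on the coarsest level, refined by Powell's method on finer levels.
# `similarity(params, level)` is a hypothetical cost function supplied by the caller.
import numpy as np
from scipy.optimize import minimize

def global_search(cost, x0, iters=200, step=5.0, rng=None):
    """Plain random-perturbation search standing in for the MBSO stage."""
    rng = rng or np.random.default_rng(0)
    best_x, best_f = np.asarray(x0, float), cost(x0)
    for _ in range(iters):
        cand = best_x + rng.normal(scale=step, size=best_x.shape)
        f = cost(cand)
        if f < best_f:
            best_x, best_f = cand, f
    return best_x

def register(similarity, x0, levels=(4, 2, 1)):
    """levels: pyramid downsampling factors from coarse to fine (1 = original image)."""
    x = np.asarray(x0, float)
    x = global_search(lambda p: similarity(p, levels[0]), x)   # coarse level: global stage
    for lvl in levels[1:]:                                      # finer levels: Powell refinement
        x = minimize(lambda p: similarity(p, lvl), x, method="Powell").x
    return x
```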
Fast label propagation algorithm based on node centrality and community similarity
GU Junhua, HUO Shijie, WANG Shoubin, TIAN Zhe
Journal of Computer Applications    2018, 38 (5): 1320-1326.   DOI: 10.11772/j.issn.1001-9081.2017102927
In order to reduce unnecessary updates and address the low accuracy and poor stability of the Label Propagation Algorithm (LPA), a Fast Label Propagation Algorithm based on Node Centrality and Community Similarity (FNCS_LPA) was proposed. According to the node centrality measure, the nodes of a network were sorted from low to high centrality and added to a node information list, which guided the update process to avoid unnecessary updates and improve the stability of community detection. The accuracy of community detection was improved by a new update rule based on community similarity. Experiments were conducted on real social networks and LFR (Lancichinetti-Fortunato-Radicchi) benchmark networks. Compared with LPA and three improved LPA algorithms, the execution speed was improved by almost a dozen times, and the modularity on the real social networks and the Normalized Mutual Information (NMI) on the LFR benchmark networks with less obvious community structure were significantly improved. The experimental results show that FNCS_LPA improves the accuracy and stability of community detection while improving the execution speed.
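A minimal sketch of the centrality-ordered propagation idea is given below (assuming networkx); node degree stands in for the centrality measure, and the plain majority rule stands in for the paper's community-similarity update rule.

```python
# Sketch of centrality-ordered label propagation (networkx assumed). Degree stands in
# for the node-centrality measure; the majority rule stands in for the paper's
# community-similarity update rule.
from collections import Counter
import networkx as nx

def centrality_ordered_lpa(G, max_iter=20):
    labels = {v: v for v in G}                       # each node starts in its own community
    order = sorted(G, key=G.degree)                  # process nodes from low to high centrality
    for _ in range(max_iter):
        changed = False
        for v in order:
            neigh = [labels[u] for u in G.neighbors(v)]
            if not neigh:
                continue
            new = Counter(neigh).most_common(1)[0][0]
            if new != labels[v]:
                labels[v], changed = new, True
        if not changed:                              # stop early once no label changes
            break
    return labels

if __name__ == "__main__":
    print(centrality_ordered_lpa(nx.karate_club_graph()))
```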
Parallel multi-layer graph partitioning method for solving maximum clique problems
GU Junhua, HUO Shijie, WU Junyan, YIN Jun, ZHANG Suqi
Journal of Computer Applications    2018, 38 (12): 3425-3432.   DOI: 10.11772/j.issn.1001-9081.2018040934
In the big data environment, the massive number of nodes in graphs and the complexity of analysis impose higher requirements on the speed and accuracy of solving maximum clique problems. To address these problems, a Parallel Multi-layer Graph Partitioning method for Solving Maximum Clique (PMGP_SMC) was proposed. Firstly, a new Multi-layer Graph Partitioning method (MGP) was proposed, which partitions a large-scale graph into subgraphs while preserving the clique structure of the original graph. Large subgraphs were further partitioned into multiple layers to reduce the subgraph size, and the GraphX graph computing framework was used to implement MGP, forming a Parallel Multi-layer Graph Partitioning (PMGP) method. Then, according to the size of the partitioned subgraphs, the number of iterations of the Local Search algorithm Based on Penalty value (PBLS) was reduced, and a speed-optimized PBLS (SPBLS) was proposed to solve the maximum clique of each subgraph. Finally, the PMGP method and SPBLS were combined to form PMGP_SMC. Large-scale Stanford datasets were used for the experiments. The experimental results show that the proposed PMGP reduces the maximum subgraph size by more than 100 times and the average subgraph size by 2 times compared with the Parallel Single Graph Partitioning method (PSGP). Compared with PSGP for Solving Maximum Clique (PSGP_SMC), the proposed PMGP_SMC reduces the overall running time by about 100 times, and its accuracy is consistent with that of Parallel Multi-layer Graph Partitioning for solving maximum clique based on Maximal Clique Enumeration (PMGP_MCE). PMGP_SMC can solve the maximum clique of a large-scale graph quickly and accurately.
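The partition-then-search idea can be illustrated with the toy single-machine sketch below (assuming networkx); it uses naive equal-size partitioning and a greedy clique heuristic in place of the paper's GraphX-based MGP and penalty-based local search.

```python
# Toy single-machine stand-in (networkx assumed) for the partition-then-search idea:
# split the graph into subgraphs, run a simple greedy clique heuristic on each, and
# keep the largest result. The paper parallelizes the partitions with GraphX and uses
# a penalty-based local search (SPBLS) rather than this greedy heuristic.
import networkx as nx

def greedy_clique(G):
    clique = []
    for v in sorted(G, key=G.degree, reverse=True):   # try high-degree vertices first
        if all(G.has_edge(v, u) for u in clique):
            clique.append(v)
    return clique

def partitioned_max_clique(G, parts=4):
    nodes = list(G)
    best = []
    for i in range(parts):                            # naive equal-size partitioning
        cand = greedy_clique(G.subgraph(nodes[i::parts]))
        if len(cand) > len(best):
            best = cand
    return best

if __name__ == "__main__":
    G = nx.gnp_random_graph(200, 0.3, seed=1)
    print(len(partitioned_max_clique(G)))
```

Note that the naive partitioning above can split a clique across subgraphs; preserving the clique structure during partitioning is exactly the problem the MGP method is designed to solve.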
Optimization and implementation of parallel FP-Growth algorithm based on Spark
GU Junhua, WU Junyan, XU Xinyun, XIE Zhijian, ZHANG Suqi
Journal of Computer Applications    2018, 38 (11): 3069-3074.   DOI: 10.11772/j.issn.1001-9081.2018041219
In order to further improve the execution efficiency of the Frequent Pattern-Growth (FP-Growth) algorithm on the Spark platform, a new parallel FP-Growth algorithm based on Spark, named BFPG (Better Frequent Pattern-Growth), was presented. Firstly, the grouping strategy F-List was improved by taking into account the size of the Frequent Pattern-Tree (FP-Tree) and the computation amount of each partition, so that the load sums of the partitions were approximately equal. Secondly, the dataset partitioning strategy was optimized by creating a P-List, which reduced the number of traversals and thus the time complexity. The experimental results show that the BFPG algorithm improves the mining efficiency of the parallel FP-Growth algorithm and has good scalability.
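The load-balancing idea behind the improved F-List can be sketched as follows (plain Python); the per-item workload estimate and the greedy assignment are illustrative stand-ins for the paper's grouping formula.

```python
# Sketch of load-balanced grouping of frequent items. Each frequent item gets an
# estimated workload, and items are assigned greedily to the currently lightest group,
# so the load sums of the partitions stay approximately equal. The workload estimate
# and the group count are illustrative, not the paper's exact formulas.
import heapq

def balanced_f_list(item_loads, num_groups):
    """item_loads: dict item -> estimated cost of mining its conditional FP-tree."""
    heap = [(0.0, g) for g in range(num_groups)]      # (current load, group id)
    heapq.heapify(heap)
    groups = {g: [] for g in range(num_groups)}
    # Assign the largest estimated loads first, so heavy items do not pile up in one group.
    for item, load in sorted(item_loads.items(), key=lambda kv: -kv[1]):
        total, g = heapq.heappop(heap)
        groups[g].append(item)
        heapq.heappush(heap, (total + load, g))
    return groups

if __name__ == "__main__":
    loads = {"a": 10, "b": 8, "c": 7, "d": 3, "e": 2, "f": 1}
    print(balanced_f_list(loads, 3))
```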
Multi-robot odor source localization based on brain storm optimization algorithm
LIANG Zhigang, GU Junhua, DONG Yongfeng
Journal of Computer Applications    2017, 37 (12): 3614-3619.   DOI: 10.11772/j.issn.1001-9081.2017.12.3614
Aiming at the problems of multi-robot odor source localization algorithms in indoor turbulent environments, such as the low utilization of historical concentration information and the lack of a mechanism to balance global and local search, a multi-robot cooperative search algorithm combining the Brain Storm Optimization (BSO) algorithm with upwind search was proposed. Firstly, the locations already searched by the robots were initialized as individuals and the robot positions were taken as the cluster centers, which made effective use of the guiding role of historical information. Secondly, the upwind search was defined as an individual mutation operation to dynamically adjust the number of new individuals generated by fusing selected individuals within one class or across two classes, which effectively balanced the global and local search. Finally, the odor source was confirmed according to two indicators: concentration and persistence. In simulation experiments in environments with and without obstacles, the proposed algorithm was compared with three swarm-intelligence multi-robot odor source localization algorithms. The experimental results show that the average search time of the proposed algorithm is reduced by more than 33% and the localization accuracy is 100%. The proposed algorithm can effectively balance the global and local search of the robots and locate the odor source quickly and accurately.
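A toy sketch of the BSO-style search loop is shown below (assuming NumPy); the plume model, wind vector, the shortcut of steering toward the single best individual, and all parameters are invented for illustration only and do not reproduce the paper's algorithm.

```python
# Toy sketch (NumPy assumed) of the BSO-style search loop: visited positions and their
# concentration readings form the individuals, robots are attracted to the best reading,
# and an upwind step acts as the mutation operation. Plume model, wind vector and all
# parameters are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)
SOURCE, WIND = np.array([8.0, 5.0]), np.array([-1.0, 0.0])   # wind blows along -x

def concentration(p):                      # hypothetical smooth plume model
    return np.exp(-np.sum((p - SOURCE) ** 2) / 4.0)

def search(n_robots=4, steps=200, p_upwind=0.4, step=0.5):
    robots = rng.uniform(0, 10, size=(n_robots, 2))
    history = [(p.copy(), concentration(p)) for p in robots]  # individuals
    for _ in range(steps):
        best = max(history, key=lambda h: h[1])[0]            # best individual so far
        for i in range(n_robots):
            if rng.random() < p_upwind:                       # mutation: step upwind
                robots[i] += -WIND / np.linalg.norm(WIND) * step
            else:                                             # move toward a fused target
                robots[i] += (best - robots[i]) * 0.2 + rng.normal(scale=0.1, size=2)
            history.append((robots[i].copy(), concentration(robots[i])))
    return max(history, key=lambda h: h[1])[0]

if __name__ == "__main__":
    print("estimated source:", search())
```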
IPTV implicit scoring model based on Hadoop
GU Junhua, GUAN Lei, ZHANG Jian, GAO Xing, ZHANG Suqi
Journal of Computer Applications    2017, 37 (11): 3188-3193.   DOI: 10.11772/j.issn.1001-9081.2017.11.3188
According to the implicit characteristics of IPTV (Internet Protocol Television) user viewing behavior data, a novel implicit scoring model was proposed. Firstly, the main features of IPTV user viewing behavior data were analyzed, and a new mixed-feature implicit scoring model combining the viewing ratio, a user interest bias factor and a video type influence factor was proposed. Secondly, a strategy for handling viewing behavior based on viewing time and viewing ratio was proposed. Finally, a distributed model architecture based on Hadoop was designed and implemented. The experimental results show that the proposed model effectively improves the quality of recommendation results in the IPTV system, improves time efficiency, and scales well to large amounts of data.
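A minimal sketch of a mixed-feature implicit score is given below (plain Python); the three factors and their weighted combination are illustrative stand-ins, not the paper's formula.

```python
# Sketch of a mixed-feature implicit score. The individual factors and the weights are
# illustrative stand-ins; the paper's exact scoring formula is not reproduced here.
def implicit_score(watch_seconds, video_seconds,
                   user_genre_counts, genre, genre_popularity,
                   w=(0.6, 0.25, 0.15)):
    # viewing ratio: how much of the video the user actually watched
    viewing_ratio = min(watch_seconds / video_seconds, 1.0)
    # user interest bias: share of the user's viewing history in this genre
    total = sum(user_genre_counts.values()) or 1
    interest_bias = user_genre_counts.get(genre, 0) / total
    # video type influence: overall popularity of this genre, in [0, 1]
    type_influence = genre_popularity.get(genre, 0.0)
    return w[0] * viewing_ratio + w[1] * interest_bias + w[2] * type_influence

if __name__ == "__main__":
    history = {"sports": 12, "news": 3, "drama": 5}
    popularity = {"sports": 0.7, "news": 0.4, "drama": 0.9}
    print(implicit_score(1500, 2700, history, "sports", popularity))
```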
New self-localization method for indoor mobile robot
ZHOU Yancong, DONG Yongfeng, WANG Anna, GU Junhua
Journal of Computer Applications    2015, 35 (2): 585-589.   DOI: 10.11772/j.issn.1001-9081.2015.02.0585

Aiming at the problems of current self-localization algorithms for indoor mobile robots, such as low positioning accuracy, positioning error that grows over time, and the multipath and non-line-of-sight effects of the signal, a new mobile robot self-localization method based on Monte Carlo Localization (MCL) was proposed. Firstly, by analyzing a mobile robot self-localization system based on Radio Frequency IDentification (RFID), the robot motion model was established. Secondly, by analyzing a mobile robot positioning system based on the Received Signal Strength Indicator (RSSI), the observation model was derived. Finally, to improve the computing efficiency of the particle filter, a particle culling strategy and a particle weighting strategy that considers the orientation of the particles were given, enhancing the positioning accuracy and execution efficiency of the new positioning system. The position errors of the new algorithm were about 3 cm in both the X and Y directions, while those of the traditional localization algorithm were about 6 cm in both directions. Simulation results show that the new algorithm doubles the positioning accuracy and has good robustness.
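The Monte Carlo localization skeleton behind the method can be sketched as follows (assuming NumPy); the tag layout, the log-distance RSSI path-loss model, the noise parameters and the plain resampling step are generic stand-ins, and the paper's particle culling and orientation-aware weighting are not reproduced.

```python
# Basic Monte Carlo localization skeleton (NumPy assumed). Motion noise, the RSSI
# path-loss observation model and the resampling step are generic stand-ins.
import numpy as np

rng = np.random.default_rng(0)
TAGS = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])  # hypothetical RFID tag positions

def rssi(distance, a=-40.0, n=2.0):
    """Log-distance path-loss model: expected RSSI (dBm) at a given distance."""
    return a - 10.0 * n * np.log10(np.maximum(distance, 0.1))

def mcl_step(particles, control, measured_rssi, motion_noise=0.05, rssi_noise=2.0):
    # 1. Motion update: apply the odometry control with Gaussian noise.
    particles = particles + control + rng.normal(scale=motion_noise, size=particles.shape)
    # 2. Observation update: weight by agreement with the measured RSSI of each tag.
    d = np.linalg.norm(particles[:, None, :] - TAGS[None, :, :], axis=2)
    err = rssi(d) - measured_rssi                       # shape (n_particles, n_tags)
    weights = np.exp(-np.sum(err ** 2, axis=1) / (2 * rssi_noise ** 2))
    weights /= weights.sum()
    # 3. Resample particles in proportion to their weights.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

if __name__ == "__main__":
    particles = rng.uniform(0, 10, size=(500, 2))
    true_pos = np.array([3.0, 4.0])
    z = rssi(np.linalg.norm(true_pos - TAGS, axis=1))   # noise-free readings for the demo
    for _ in range(10):
        particles = mcl_step(particles, control=np.zeros(2), measured_rssi=z)
    print("estimated position:", particles.mean(axis=0))
```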

Continuous function optimization based on improved harmony search algorithm
LU Jing, GU Junhua
Journal of Computer Applications    2014, 34 (1): 194-198.   DOI: 10.11772/j.issn.1001-9081.2014.01.0194
Concerning the difficulties of the basic Harmony Search (HS) algorithm in optimizing continuous functions, an improved HS algorithm was proposed. Drawing an analogy with the simulated annealing algorithm, the parameter-updating scheme was redesigned, and the number of identical harmonies stored in the harmony memory was limited to increase the diversity of solutions. Simulation results of the proposed algorithm were compared with those of other HS variants. The computational results reveal that the proposed algorithm is more effective than other HS variants in enhancing solution quality and convergence speed.
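A minimal harmony search sketch incorporating two of the described ideas, an annealing-style decay of the pitch-adjustment bandwidth and a cap on identical harmonies in memory, is shown below (assuming NumPy); all parameter values are illustrative.

```python
# Minimal harmony search sketch (NumPy assumed) with an annealing-style bandwidth decay
# and a limit on duplicate harmonies in memory. Parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def harmony_search(f, lb, ub, dim, hms=20, hmcr=0.9, par=0.3,
                   iters=5000, bw0=1.0, decay=0.999, max_dup=2):
    hm = rng.uniform(lb, ub, size=(hms, dim))         # harmony memory
    fit = np.apply_along_axis(f, 1, hm)
    bw = bw0
    for _ in range(iters):
        new = np.empty(dim)
        for j in range(dim):
            if rng.random() < hmcr:                   # take a value from memory
                new[j] = hm[rng.integers(hms), j]
                if rng.random() < par:                # pitch adjustment
                    new[j] += rng.uniform(-bw, bw)
            else:                                     # random value
                new[j] = rng.uniform(lb, ub)
        new = np.clip(new, lb, ub)
        bw *= decay                                   # annealing-style bandwidth decay
        # reject the new harmony if the memory already holds max_dup near-identical copies
        dup = np.sum(np.all(np.isclose(hm, new, atol=1e-6), axis=1))
        worst = np.argmax(fit)
        if dup < max_dup and f(new) < fit[worst]:
            hm[worst], fit[worst] = new, f(new)
    best = np.argmin(fit)
    return hm[best], fit[best]

if __name__ == "__main__":
    sphere = lambda x: float(np.sum(x ** 2))
    print(harmony_search(sphere, -5.0, 5.0, dim=10))
```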